Translation and Dictionary (翻訳と辞書)
Words near each other
・ Least puffer
・ Least pygmy squirrel
・ Least restrictive environment
・ Least sandpiper
・ Least seedsnipe
・ Least shrew tenrec
・ Least significant bit
・ Least slack time scheduling
・ Least soft-furred mouse
・ Least squares
・ Least squares adjustment
・ Least squares conformal map
・ Least squares inference in phylogeny
・ Learning Tree International
・ Learning Unlimited
・ Learning vector quantization
・ Learning with errors
・ Learning with FuzzyWOMP
・ Learning with Leeper
・ Learning-by-doing
・ Learning-by-doing (economics)
・ LearningNI
・ LearningRx
・ Learnit Institute of Business and Technology
・ Learnium International School
・ Learnmore Jongwe
・ Learnscapes
・ LearnShare
・ Learnshift India
・ LearnSocial



Learning vector quantization : Wikipedia English edition
Learning vector quantization
In computer science, learning vector quantization (LVQ) is a prototype-based supervised classification algorithm. LVQ is the supervised counterpart of vector quantization systems.
== Overview ==
LVQ can be understood as a special case of an artificial neural network; more precisely, it applies a winner-take-all, Hebbian-learning-based approach. It is a precursor to self-organizing maps (SOM) and is related to neural gas and to the k-nearest neighbors algorithm (k-NN). LVQ was invented by Teuvo Kohonen.〔T. Kohonen. Self-Organizing Maps. Springer, Berlin, 1997.〕
An LVQ system is represented by prototypes W = (w(1), ..., w(n)) defined in the feature space of the observed data. In winner-take-all training algorithms, one determines, for each data point, the prototype closest to the input according to a given distance measure. The position of this so-called winner prototype is then adapted: the winner is moved closer to the data point if it classifies it correctly, and moved away if it classifies it incorrectly.
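The winner-take-all update described above can be sketched in Python. This is a minimal LVQ1-style illustration, not taken from the cited sources; the function names, Euclidean distance, and fixed learning rate are illustrative assumptions.

```python
import numpy as np

def lvq1_train(X, y, prototypes, proto_labels, lr=0.1, epochs=10):
    """Basic LVQ1 sketch: for each sample, find the closest prototype
    (the winner) and move it toward the sample if their labels match,
    away from it otherwise."""
    W = prototypes.astype(float).copy()
    for _ in range(epochs):
        for x, label in zip(X, y):
            # Winner = prototype with smallest Euclidean distance to x.
            dists = np.linalg.norm(W - x, axis=1)
            j = np.argmin(dists)
            # +1 pulls the winner closer, -1 pushes it away.
            sign = 1.0 if proto_labels[j] == label else -1.0
            W[j] += sign * lr * (x - W[j])
    return W

def lvq1_predict(X, W, proto_labels):
    """Classify each sample by the label of its nearest prototype."""
    dists = np.linalg.norm(W[None, :, :] - X[:, None, :], axis=2)
    return proto_labels[np.argmin(dists, axis=1)]
```

For two well-separated Gaussian clusters, training pulls each prototype into its cluster, and nearest-prototype classification then recovers the labels.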
An advantage of LVQ is that it creates prototypes that are easy to interpret for experts in the respective application domain.〔T. Kohonen. Learning vector quantization. In: M.A. Arbib, editor, The Handbook of Brain Theory and Neural Networks, pages 537–540. MIT Press, Cambridge, MA, 1995.〕
LVQ systems can be applied to multi-class classification problems in a natural way. They are used in a variety of practical applications; see http://liinwww.ira.uka.de/bibliography/Neural/SOM.LVQ.html for an extensive bibliography.
A key issue in LVQ is the choice of an appropriate measure of distance or similarity for training and classification. Techniques have been developed that adapt a parameterized distance measure in the course of training the system; see, e.g., (Schneider, Biehl, and Hammer, 2009)〔P. Schneider, B. Hammer, and M. Biehl. Adaptive Relevance Matrices in Learning Vector Quantization. Neural Computation 21: 3532–3561, 2009. http://www.mitpressjournals.org/doi/abs/10.1162/neco.2009.10-08-892〕 and references therein.
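As a concrete illustration of a parameterized distance, relevance-based LVQ variants replace the plain Euclidean distance with a feature-weighted one whose weights are adapted during training. A minimal sketch, with illustrative names (the weight vector `lam` stands in for the relevance parameters; this is not the full matrix scheme of the cited paper):

```python
import numpy as np

def relevance_distance(x, w, lam):
    """Squared Euclidean distance with per-feature relevance weights
    lam (non-negative, typically normalized to sum to one)."""
    diff = x - w
    return float(np.sum(lam * diff ** 2))
```

With lam = (1, 0) the second feature is ignored entirely; adapting lam during training lets the classifier emphasize the most discriminative features.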
LVQ has also proved helpful for classifying text documents.〔Fahad and Sikander. Classification of textual documents using learning vector quantization. Information Technology Journal 6.1 (2007): 154–159. http://198.170.104.138/itj/2007/154-159.pdf〕

Excerpt source: the free encyclopedia Wikipedia.
Read the full article on "Learning vector quantization" at Wikipedia.




Copyright(C) kotoba.ne.jp 1997-2016. All Rights Reserved.